9 - Komplexität von Algorithmen [ID:10702]

We start today's lecture by continuing with what we have seen in the last two or three lectures, which is related to average complexities. You have seen a couple of examples so far. One was adding one to a counter: in the worst case this is linear in the number of digits, but since half of the numbers are even and have a zero in the last position, half of the remaining ones have a zero in the second position, et cetera, on average it is a fast operation; it takes constant time.
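A minimal sketch of the averaging argument, assuming a binary counter with uniformly random contents: the increment changes exactly k digits precisely when the last k-1 digits are 1 and the k-th digit is 0, which happens with probability 2^{-k}, so the expected number of changed digits is

    \mathbb{E}[\text{digits changed}] \,=\, \sum_{k \ge 1} k \cdot 2^{-k} \,=\, 2 \,=\, O(1).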

You have also seen the example of looking for the maximum in an array and counting how many times we change candidates. And the last example, considered last time, was what happens on average in a linear search. We will come back to this example now.

One might wonder: why would one care about a linear search at all? If we have an array, we can just sort the elements in the order we want and then perform a binary search. So why would we care about the average-case complexity of a linear search? Well, the answer is that it is not always integers that we put in our arrays, and there are many algorithms that in the end are equivalent to a linear search. For instance, as a motivational example, consider the case where we want to do some image processing. We receive images from a satellite or whatever, and we want to see if we can find aliens in the pictures, or faces, or count rocks, and we have different kinds of image-processing algorithms. Some work with certain kinds of pictures, those that are darker or lighter, more or less blurry, or they detect certain kinds of alien faces. So when we are given a picture, we try it with one algorithm after the other until one of them detects an alien or whatever. This is a case of a linear search: we are trying one case after the other, and there is nothing to sort here; we cannot impose an order on these algorithms. So we can keep this as a motivational example. The only assumption we have to make is that for this image-processing task, for each image, only one of the algorithms will return us something. This is an assumption that simplifies everything.
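A minimal sketch of this setup in code (the function names and the detector interface are illustrative assumptions, not from the lecture):

    def linear_search(detectors, image):
        """Try each image-processing algorithm in order until one succeeds.

        detectors is a list of functions that return a detection result
        (a face, an alien, ...) or None. By the simplifying assumption
        above, at most one detector returns something for a given image.
        """
        for steps, detect in enumerate(detectors, start=1):
            result = detect(image)
            if result is not None:
                return result, steps   # found after `steps` trials
        return None, len(detectors)    # worst case: n steps, nothing found

The cost of a search is the number of detectors tried, which is exactly the position of the successful algorithm in the list; this is what the arrangement discussed below will optimize.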

Given a setup like this, we will receive images, that is, things for which we need to perform a linear search, and in the worst case we know it will take us n steps, where n is the length of our array, the number of image-processing algorithms we know. But we would somehow like to optimize for the average case. If we know it is more likely that we will receive images of a certain type, which are handled better by some type of algorithm, and so on, what can we do to make things better? This is what was done at the end of the last lecture.

The idea is the following; let us fix some notation. The array elements are 1 to n: we have an array of size n, we count from 1, et cetera. Here the elements could be the image-processing algorithms or whatever. Then we have a permutation L that tells us, for each of the elements that need to go into the array, in which position of the array it will be: element 1 is in position 5, et cetera. And the other thing we need is p_i, the probability that element i is searched for. This is our notation.
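In this notation, the expected search cost can be written out explicitly (a standard formulation): finding element i with a linear search means probing L(i) positions, so an arrangement L costs on average

    \mathbb{E}[\text{search cost of } L] \,=\, \sum_{i=1}^{n} p_i \cdot L(i).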

What was seen last time by the end of the lecture was the following (here I am just doing a recap). An arrangement, or permutation, L is optimal w.r.t. (with respect to) the expected search cost if it satisfies: if the probability of i is greater than the probability of j, then L(i) is smaller than L(j). What this says, in less mathematical terms, is that the best thing you can do to get the optimal average complexity is to put the elements that are looked up more often at the front. So, thanks, Einstein, you may say, I would not have imagined that. It is pretty reasonable. We actually proved that this is the case.

And the second thing we saw: okay, the first one is reasonable, yes, but an arrangement L (this permutation here; the arrangement is where we put the elements in the array) is optimal with respect to the expected search cost, that is, on average how expensive it will be to search for an element with a linear search, if it satisfies that when the probability of looking for i is greater than the probability of looking for j, that is, when we look for …
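To make the criterion concrete, here is a minimal sketch in code (the names and numbers are illustrative; the arrangement is given as a front-to-back list of elements, i.e. the inverse of the permutation L above):

    def expected_cost(arrangement, p):
        """Expected number of probes of a linear search.

        arrangement lists the elements from front to back, so the element
        at index pos-1 is found after pos probes; p[i] is the probability
        that element i is searched for.
        """
        return sum(p[elem] * pos for pos, elem in enumerate(arrangement, start=1))

    p = [0.1, 0.5, 0.15, 0.25]                            # access probabilities
    optimal = sorted(range(len(p)), key=lambda i: -p[i])  # most probable first
    print(expected_cost(optimal, p))                      # ~1.85
    print(expected_cost([0, 1, 2, 3], p))                 # ~2.55, strictly worse

Sorting by decreasing p_i is exactly the rule from the lecture: whenever p_i > p_j, element i ends up in front of element j.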

Part of a video series:
Accessible via: Open access
Duration: 01:34:00 min
Recording date: 2013-05-17
Uploaded on: 2019-04-22 23:49:03
Language: de-DE

  • Mathematical tools of algorithm analysis: estimating the asymptotic growth of functions, summations, counting arguments, divide-and-conquer recurrences, etc.
  • Basic concepts of quantitative algorithm analysis: worst-case and average-case analysis, upper and lower bounds, algorithm and problem complexity
  • Exemplary analyses of sorting algorithms
  • Sorting complexity and entropy
  • Source coding and data compression
  • Complexity of arithmetic operations and problems (multiplication, primality testing, factorization)
  • Modular arithmetic and the fast Fourier transform
  • Cryptography and complexity

Learning objectives and competencies:

Students

  • acquire sound knowledge of the basic concepts of quantitative algorithm analysis (running time) and of the required mathematical methods
  • understand the complexity (running-time behavior) of standard algorithms (e.g. sorting, arithmetic algorithms) and can explain their practical significance
  • are able to carry out worst-case and average-case analyses of simple, exemplary algorithms
  • can relate algorithm complexity and problem complexity using examples
  • can describe the relationship between sorting and search complexity and the notion of entropy
  • acquire basic knowledge of the algebraic structures of arithmetic and of the complexity of arithmetic operations
  • can explain the role of complexity statements in assessing the security of simple cryptographic protocols

Literature:

Graham, Knuth, Patashnik, Concrete Mathematics, Addison-Wesley, 1994.
Cormen, Leiserson, Rivest, Stein, Introduction to Algorithms, MIT Press, 2001.
Heun, Grundlegende Algorithmen, Vieweg, 2001.
